Subtracting a best rank-1 approximation does not necessarily decrease tensor rank
Authors
Abstract
It has been shown that a best rank-R approximation of an order-k tensor may not exist when R ≥ 2 and k ≥ 3. This poses a serious problem for data analysts using tensor decompositions. It has been observed numerically that, generally, this issue cannot be solved by consecutively computing and subtracting best rank-1 approximations. The reason for this is that subtracting a best rank-1 approximation generally does not decrease tensor rank. In this paper, we provide a mathematical treatment of this property for real-valued 2 × 2 × 2 tensors, with symmetric tensors as a special case. Regardless of the symmetry, we show that for generic 2 × 2 × 2 tensors (which have rank 2 or 3), subtracting a best rank-1 approximation results in a tensor that has rank 3 and lies on the boundary between the rank-2 and rank-3 sets. Hence, for a typical tensor of rank 2, subtracting a best rank-1 approximation increases the tensor rank.
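As a rough numerical illustration of this claim (not code from the paper), the sketch below builds a generic rank-2 real 2 × 2 × 2 tensor, approximates a best rank-1 term by higher-order power iteration with random restarts, and inspects Cayley's hyperdeterminant of the residual; for real 2 × 2 × 2 tensors, a positive hyperdeterminant indicates rank 2, a negative one rank 3, and zero the boundary between the two sets. The function names, the restart heuristic for finding the global best rank-1 term, and the expected outputs are assumptions made for illustration only.

```python
# Minimal numerical sketch (illustrative, not from the paper): subtract an
# approximate best rank-1 term from a generic rank-2 real 2x2x2 tensor and
# check Cayley's hyperdeterminant of the residual.
import numpy as np

def hyperdeterminant(T):
    """Cayley's hyperdeterminant of a 2x2x2 tensor, computed as the discriminant
    of the quadratic x -> det(T[:, :, 0] + x * T[:, :, 1]).
    For real tensors: > 0 indicates rank 2, < 0 rank 3, = 0 the boundary set."""
    A, B = T[:, :, 0], T[:, :, 1]
    b = A[0, 0]*B[1, 1] + B[0, 0]*A[1, 1] - A[0, 1]*B[1, 0] - B[0, 1]*A[1, 0]
    return b**2 - 4.0 * np.linalg.det(A) * np.linalg.det(B)

def best_rank1(T, n_starts=30, n_iter=500, seed=None):
    """Approximate a best rank-1 term sigma * (x o y o z) by higher-order power
    iteration (rank-1 ALS), keeping the best of several random starts."""
    rng = np.random.default_rng(seed)
    best_sigma, best_term = 0.0, np.zeros_like(T)
    for _ in range(n_starts):
        x, y, z = rng.standard_normal(2), rng.standard_normal(2), rng.standard_normal(2)
        for _ in range(n_iter):
            x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
            y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
            z = np.einsum('ijk,i,j->k', T, x, y); z /= np.linalg.norm(z)
        sigma = np.einsum('ijk,i,j,k->', T, x, y, z)
        if abs(sigma) > abs(best_sigma):
            best_sigma, best_term = sigma, sigma * np.einsum('i,j,k->ijk', x, y, z)
    return best_term

rng = np.random.default_rng(0)
rank1 = lambda: np.einsum('i,j,k->ijk', rng.standard_normal(2),
                          rng.standard_normal(2), rng.standard_normal(2))
T = rank1() + rank1()            # generic rank-2 tensor
R = T - best_rank1(T, seed=1)    # residual after removing an (approximate) best rank-1 term
print("hyperdet(T)        =", hyperdeterminant(T))   # expected: clearly positive (rank 2)
print("hyperdet(residual) =", hyperdeterminant(R))   # expected: near 0 (boundary, rank 3)
```

The hyperdeterminant criterion used here is the standard rank classification for real 2 × 2 × 2 tensors; the power iteration with restarts is only a heuristic for the global best rank-1 approximation, so the residual's hyperdeterminant is expected to be near zero only up to the accuracy of that optimization.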
Related resources
Subtracting a best rank-1 approximation may increase tensor rank (Jun 2009)
It has been shown that a best rank-R approximation of an order-k tensor may not exist when R ≥ 2 and k ≥ 3. This poses a serious problem to data analysts using tensor decompositions. It has been observed numerically that, generally, this issue cannot be solved by consecutively computing and subtracting best rank-1 approximations. The reason for this is that subtracting a best rank-1 approximati...
On the Tensor SVD and Optimal Low Rank Orthogonal Approximations of Tensors
Abstract. It is known that a high order tensor does not necessarily have an optimal low rank approximation, and that a tensor might not be orthogonally decomposable (i.e., admit a tensor SVD). We provide several sufficient conditions which lead to the failure of the tensor SVD, and characterize the existence of the tensor SVD with respect to the Higher Order SVD (HOSVD) of a tensor. In face of ...
On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
In this paper we discuss a multilinear generalization of the best rank-R approximation problem for matrices, namely, the approximation of a given higher-order tensor, in an optimal least-squares sense, by a tensor that has prespecified column rank value, row rank value, etc. For matrices, the solution is conceptually obtained by truncation of the singular value decomposition (SVD); however, this...
BEST APPROXIMATION IN QUASI TENSOR PRODUCT SPACE AND DIRECT SUM OF LATTICE NORMED SPACES
We study the theory of best approximation in the tensor product and the direct sum of some lattice normed spaces X_{i}. We introduce the quasi tensor product space and discuss the relation between the tensor product space and this new space, which we denote by X ⊠ Y. We investigate best approximation in the direct sum of lattice normed spaces by elements which are not necessarily downward or upward a...
Complex Tensors Almost Always Have Best Low-rank Approximations
Low-rank tensor approximations are plagued by a well-known problem: a tensor may fail to have a best rank-r approximation. Over R, it is known that such failures can occur with positive probability, sometimes with certainty: in R^{2×2×2}, every tensor of rank 3 fails to have a best rank-2 approximation. We will show that while such failures still occur over C, they happen with zero probability. I...
Journal:
Volume / Issue:
Pages: -
Publication date: 2015